Tech Giants Bring ML Inference On Edge With AI-based Chips


Emerging technologies are advancing at an exponential pace, and enterprises are rushing to adopt the latest trends. From cars to healthcare, AI has proven its business applicability. In this article, we discuss how leading tech giants use AI-based inference chips as the next step in the evolution of mobile devices. Smartphone manufacturers are now integrating faster AI capabilities into their devices, from the user interface to the apps people use every day.

In 2017, the prominent chipmaker Intel introduced its new Movidius Myriad X vision processing unit (VPU), advancing Intel's end-to-end portfolio of AI solutions to deliver more autonomous capabilities across a wide range of product categories, including drones, robotics, smart cameras, and virtual reality. The chip has a dedicated Neural Compute Engine for accelerating deep learning inference at the edge and is designed to run deep neural networks at high speed and low power without any loss of accuracy.
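One common way edge accelerators like VPUs achieve high speed and low power with little accuracy loss is by running layers in reduced precision. The sketch below is a minimal, hypothetical illustration (not Intel's actual implementation): it quantizes a layer's weights to int8, runs a matrix multiply, rescales the result, and checks that the output stays close to the full-precision float32 result.

```python
import numpy as np

# Minimal sketch of int8 weight quantization, a technique edge inference
# chips commonly use. All shapes and values here are illustrative.
rng = np.random.default_rng(0)
x = rng.standard_normal((1, 64)).astype(np.float32)   # input activations
w = rng.standard_normal((64, 10)).astype(np.float32)  # layer weights

# Symmetric per-tensor quantization: map the weight range onto [-127, 127].
scale = np.abs(w).max() / 127.0
w_q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)

# Multiply using the quantized weights, then rescale back to float.
y_quant = (x @ w_q.astype(np.float32)) * scale
y_fp32 = x @ w

# The quantized result closely tracks the full-precision one.
max_err = np.abs(y_quant - y_fp32).max()
print(f"max abs error vs float32: {max_err:.4f}")
```

The design point is that 8-bit weights cut memory traffic and arithmetic cost by roughly 4x versus float32, which is where most of the power savings on an edge device come from, while the rescaling step keeps the numerical output close to full precision.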